'Open the pod bay door, HAL' — here's how AI became a movie villain
Date: 2025-04-17
This article was written by a human.
That's worth mentioning because it's no longer something you can just assume. Artificial intelligence that can mimic conversation, whether written or spoken, has been in the news a lot this year, delighting some members of the public while worrying educators, politicians, the World Health Organization, and even some of the people developing AI technology.
Misuse of AI is part of what actors and writers are striking about in Hollywood, and the threat of AI is something Hollywood was imagining long before it was real.
In 1968, for instance, the year before humans first set foot on the moon — and a time when astronauts still used pencils and slide rules to calculate re-entry trajectories because their space capsules had less computing power than a digital watch has today — Stanley Kubrick introduced movie audiences to a sentient HAL-9000 computer in 2001: A Space Odyssey.
HAL (for Heuristically Programmed Algorithmic Computer) introduced itself early in the film by saying, "No 9000 computer has ever made a mistake or distorted information. We are all, by any practical definition of the words, foolproof and incapable of error."
'Open the pod bay door, HAL'
So why was HAL acting so strangely? He (it?) was responsible for maintaining all aspects of a months-long space flight, ferrying astronauts to the moons of Jupiter. Programmed to run the mission flawlessly, the computer's behavior had become alarming, and two of the astronauts had decided to shut down some of its functions. Their plan was short-circuited when HAL, lip-reading a conversation they'd managed to keep him from hearing, cast one of them adrift while he was outside the ship repairing an antenna and refused to let the other back on board.
"Open the pod bay door, HAL" became one of the most quoted film lines of the decade when the computer responded, "I'm sorry, Dave, I'm afraid I can't do that. This mission is too important for me to allow you to jeopardize it."
It's hard to articulate what a genuine shock this was for 1960s movie audiences. There'd been films with, say, robots causing havoc, but they were generally robots doing someone else's bidding. Movie robots, at that point, were about brawn, not brain.
And anyway, malevolent robot stories were precisely the sort of B-movie silliness Kubrick was trying to avoid. So his intelligent machine simply observed (with an unblinking red eye) and, when addressed directly, spoke with a calm, modulated voice, not unlike the one that would be adopted four decades later by Siri and Alexa.
Darwin Among the Machines
Earlier literary notions of "artificial" intelligence — and there were not a lot of them at that point — hadn't really caught the public's imagination. Samuel Butler's 1863 article Darwin Among the Machines is generally thought to be the origin of this species of writing, and it mostly just notes that while humankind invented machines to assist us — and remember, a really sophisticated machine in 1863 was the steam locomotive — we were increasingly assisting them: tending, fueling, repairing.
Over tens of thousands of years, Butler wondered, might humans not evolve in much the same way Darwin's study of natural selection had just established the rest of the plant and animal kingdoms do, to the point that we would become dependent on our devices?
But even when he incorporated that idea a decade later into a satirical novel called Erewhon, expounding for several chapters on self-replicating machines, Butler barely touched on the notion that those machines would develop consciousness. And neither did the influential 19th-century science fiction writers who followed him. H.G. Wells and Jules Verne invented plenty of unorthodox devices as they sent characters to the center of the Earth, and into space and the recesses of time, without ever considering that those devices might want to do things on their own.
The term "artificial intelligence" wasn't even coined (by American computer scientist John McCarthy) until about a dozen years before Kubrick made his Space Odyssey. But HAL made an impression on the public where scientists had not. Within just a couple of years, movie computers didn't just want spaceship domination; in Colossus: The Forbin Project (1970), they wanted to take over the world.
Malignant machines gone viral
And then this notion of technology-run-wild ran wild. A high school student played by Matthew Broderick nearly started World War III in WarGames (1983) when he thought he was dialing into a computer game company's system but accidentally challenged the Pentagon's defense network to a quick game of "global thermonuclear war." The problem, it soon became clear, was that no one told the defense network they were just "playing."
Elsewhere, mechanical men stopped being all-brawn and got a new dispensation to think for themselves, something fiction had granted them before Hollywood got around to it.
In the 1940s, sci-fi novelist Isaac Asimov came up with "Three Laws of Robotics" that would theoretically keep "independent" machines in line. When Asimov's story collection I, Robot was turned into a film a half-century or so later, those laws should have reassured Will Smith as he stared down thousands of bots. But he had good reason to be skeptical; he was fighting a robot rebellion.
The Terminator movies effectively put all these themes on steroids — cyborgs in the service of a computerized, sentient, civil-defense network called Skynet, designed to function without any human input. A "Nuclear Fire" and three billion human deaths later, what was left of humanity was engaged in a war against the machines that has so far consumed six films, a TV series, a pair of web series, and innumerable games.
And nuclear blasts weren't necessary to make machine intelligence alarming, a fact cyberpunk-noir established definitively in Blade Runner with its "replicants," and in a Matrix series that reduced all of humanity to a mere power source for machines.
Hollywood's still fighting that vision. Who knows what "The Entity" wants in Mission: Impossible - Dead Reckoning (presumably we'll find out next year in Part Two), but whatever it is, it won't bode well for humanity.
It seems not to have occurred to Tinseltown that AI might do the things it's actually doing — make social media dangerous, or make undergrad writing courses unteachable, or screw up relationships by auto-completing incorrectly. None of those are terribly cinematic, so Hollywood concentrates on exploiting our fears — in the late 20th century, we worried about ceding control to technology. In the 21st century, we worry about losing control of technology.
Bring on the droids
Have there also been friendlier film visions of AI? Sure. George Lucas came up with lovable droids R2-D2 and C-3PO for Star Wars, and Pixar gave us Wall-E, a bot who was pluckily determined to clean up an entire planet we'd despoiled.
Spike Jonze's drama Her imagined a sentient, Siri-like personal assistant as a digital girlfriend. Star Trek's Data was not just a Next Generation android version of Mr. Spock, but also a sort of emotion-challenged Pinocchio.
And another Pinocchio — this one fashioned to stand the test of time — would have been Stanley Kubrick's own answer to the question he'd posed with HAL in 1968.
Kubrick labored for decades to hone the script for A.I. Artificial Intelligence, then, just two years before he died, handed the project off to Steven Spielberg. It is the story of David, a robot child who has been programmed to love, and who ends up going beyond that programming.
"Until you were born," William Hurt's Professor Hobby told the bionic child he'd modeled on his own son, "robots didn't dream, robots didn't desire unless we told them what to want." The miracle, he went on, was that though David was engineered rather than born, he shared with humans "the ability to chase down our dreams...something no machine has ever done, until you."
That may not have been enough to make David a real boy, but it put a gentle face on what is perhaps our greatest fear about AI – that we are mortal, and it is not.
In the film, David outlives all of humanity, never growing up, never changing. And perhaps because he was played by Haley Joel Osment, or perhaps because Spielberg was calling the shots, or perhaps because the music swelled ... just so — it didn't feel the least bit threatening.